Asynchronous Subgradient-Push

Authors

  • Mahmoud Assran
  • Michael Rabbat
Abstract

We consider a multi-agent framework for distributed optimization where each agent in the network has access to a local convex function and the collective goal is to achieve consensus on the parameters that minimize the sum of the agents’ local functions. We propose an algorithm wherein each agent operates asynchronously and independently of the other agents in the network. When the local functions are strongly-convex with Lipschitz-continuous gradients, we show that a subsequence of the iterates at each agent converges to a neighbourhood of the global minimum, where the size of the neighbourhood depends on the degree of asynchrony in the multi-agent network. When the agents work at the same rate, convergence to the global minimizer is achieved. Numerical experiments demonstrate that Asynchronous Subgradient-Push can minimize the global objective faster than state-of-the-art synchronous first-order methods, is more robust to failing or stalling agents, and scales better with the network size.
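
The abstract does not spell out the update rule, but the method builds on the subgradient-push protocol of Nedić and Olshevsky, in which each agent mixes push-sum numerators and weights over a directed graph and then takes a local (sub)gradient step. A minimal sketch of the synchronous baseline (the paper's contribution is letting agents run these steps at their own rates) is given below; the names grads, A, x0, and alpha are hypothetical, not from the paper:

```python
import numpy as np

def subgradient_push(grads, A, x0, alpha, num_iters):
    """Synchronous subgradient-push sketch over a directed network.

    grads : list of callables; grads[i](z) returns a (sub)gradient of
            agent i's local convex function f_i at z.
    A     : (n, n) column-stochastic mixing matrix; A[i, j] > 0 only if
            agent j sends to agent i along a directed edge.
    x0    : (n, d) array of initial iterates, one row per agent.
    alpha : step size (a diminishing schedule is needed for exact
            convergence).
    """
    n, _ = x0.shape
    x = x0.copy()           # push-sum numerators
    y = np.ones(n)          # push-sum weights (denominators)
    for _ in range(num_iters):
        x = A @ x           # mix numerators along directed edges
        y = A @ y           # mix push-sum weights identically
        z = x / y[:, None]  # de-bias: undo the skew of directed mixing
        for i in range(n):
            x[i] -= alpha * grads[i](z[i])  # local (sub)gradient step
    return x / y[:, None]
```

For instance, with grads = [lambda z, a=a: z - a for a in targets], each local function is f_i(z) = ||z - a_i||^2 / 2 and the agents reach consensus on the average of the targets. The division by the push-sum weight y is what corrects the bias that column-stochastic (directed) mixing would otherwise introduce.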

Similar Articles

M1, M2, ..., Mk/G1, G2, ..., Gk/1/N Queue with Buffer Division and Push-Out Schemes for ATM Networks (RESEARCH NOTE)

In this paper, loss probabilities and steady-state probabilities of data packets for an asynchronous transfer mode (ATM) network are investigated under the buffer-division and push-out schemes. Data packets are classified into k classes, arrive at the service facility in Poisson fashion, and are served at a general service rate; under the buffer-division scheme, the finite buffer space N is divided in...

Privacy Preservation in Distributed Subgradient Optimization Algorithms

In this paper, some privacy-preserving features for distributed subgradient optimization algorithms are considered. Most of the existing distributed algorithms focus mainly on the algorithm design and convergence analysis, but not the protection of agents' privacy. Privacy is becoming an increasingly important issue in applications involving sensitive information. In this paper, we first show t...

ExtraPush for Convex Smooth Decentralized Optimization over Directed Networks

In this note, we extend the existing algorithms Extra [13] and subgradient-push [10] to a new algorithm ExtraPush for convex consensus optimization over a directed network. When the network is stationary, we propose a simplified algorithm called Normalized ExtraPush. These algorithms use a fixed step size like in Extra and accept the column-stochastic mixing matrices like in subgradient-push. W...

Incremental Aggregated Proximal and Augmented Lagrangian Algorithms

We consider minimization of the sum of a large number of convex functions, and we propose an incremental aggregated version of the proximal algorithm, which bears similarity to the incremental aggregated gradient and subgradient methods that have received a lot of recent attention. Under cost function differentiability and strong convexity assumptions, we show linear convergence for a sufficien...
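
For orientation, the "aggregated" idea is to keep a table of the most recently evaluated component gradients and to step along their (possibly stale) sum, refreshing one entry per iteration. A minimal sketch of that bookkeeping, which the paper's proximal variant builds on, follows; the names grad_fns and alpha are assumptions for illustration:

```python
import numpy as np

def incremental_aggregated_gradient(grad_fns, x0, alpha, num_iters):
    """Incremental aggregated gradient (IAG) sketch.

    Only one component gradient is re-evaluated per iteration; the
    step uses the sum of all stored (possibly stale) gradients. The
    paper's proximal variant replaces the plain gradient step on the
    selected component with a proximal step.
    """
    m = len(grad_fns)
    x = x0.copy()
    table = [g(x) for g in grad_fns]  # stored component gradients
    agg = sum(table)                  # running aggregate of the table
    for k in range(num_iters):
        i = k % m                     # cyclic component selection
        new_g = grad_fns[i](x)
        agg += new_g - table[i]       # refresh one entry in O(d) work
        table[i] = new_g
        x = x - alpha * agg           # step along the stale aggregate
    return x
```

The O(d) aggregate update is the point of the construction: each iteration costs one component-gradient evaluation rather than m, while the step direction still approximates the full gradient.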

Asynchronous Subgradient Optimization in Noisy Networks

We consider a gossip-type subgradient optimization method that can be applied to networks with noisy communication links. Each node in the network has a function that is not known to the other nodes, and the goal is to cooperatively minimize the sum of all the functions in the network. In this noisy environment, we show that with our distributed optimization algorithm...
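
As a rough illustration of the setting rather than the paper's specific algorithm, a gossip-type subgradient step in which every transmitted iterate is corrupted by additive link noise might look as follows; W (a doubly stochastic gossip matrix), sigma, and grads are all assumed names:

```python
import numpy as np

def noisy_gossip_subgradient(grads, W, x0, alpha, sigma, num_iters, rng=None):
    """Gossip-type subgradient sketch with noisy communication links.

    Each agent averages noise-corrupted iterates received from its
    neighbours, then takes a local subgradient step.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = x0.shape
    x = x0.copy()
    for _ in range(num_iters):
        noisy = x + sigma * rng.standard_normal((n, d))  # link noise
        x = W @ noisy                                    # gossip averaging
        for i in range(n):
            x[i] -= alpha * grads[i](x[i])               # local step
    return x
```

Here each agent broadcasts a single noisy copy of its iterate to all neighbours, which keeps the sketch simple; modelling per-link noise would draw an independent perturbation for every edge instead.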


Publication date: 2018